This article originally appeared in The Bar Examiner print edition, Fall 2019 (Vol. 88, No. 3), pp 4–6.
By Judith A. Gundersen
What We Did During Our Summer Non-Vacation
In keeping with the familiar back-to-school essays grade-school students around the country write about what they did during their summer vacation, I thought I’d write an essay about NCBE’s summer non-vacation to let you know what kept NCBE staff and volunteers busy. While most of us did manage to take vacations, we worked diligently all summer on a number of initiatives and projects that will lay the groundwork for transformative change in the world of bar admissions.
July 2019 Exam Results
Before writing about all the activities we are engaged in, I’d be remiss not to mention the July bar exam—always the axis around which our summer activities revolve. And when I write “our,” I mean not only NCBE’s testing departments, but also bar admissions staff around the country. Summer vacations for those in bar admissions always take into account the July bar exam. At NCBE, staff members carefully coordinate days off and Fourth of July getaways with exam production, shipment, grading, and scoring schedules in mind. Even after 17 years of working in the Test Operations Department, I still find the July bar exam an indelible summer rite that combines excitement, anticipation, and a lot of work. And of course, the heavy lifting of administering the exam is in the capable hands of bar admissions staff. For the examinees themselves, summer is a time of intense studying and preparation following their graduation from law schools dedicated to preparing them for the practice of law.
The good news this July is that for many examinees, that intense preparation paid off. As we already reported in our September press release, the MBE mean of 141.1 was up 1.6 points over July 2018, the largest increase over a previous July’s mean since July 2008. This portends higher overall scores and higher pass rates in most jurisdictions, and indeed that is already proving true as jurisdictions begin releasing their results. The mean for likely first-time takers was up 2.2 points from July 2018, the highest mean for first-time takers since 2013. See the Testing Column for more details about the July 2019 MBE results.
MPRE Transition to Computer-Based Testing
But the bar exam was only part of what we worked on this summer (this is a long essay). Dozens of our staff members worked all summer—in fact, they have been working for the past two years—to launch the inaugural computer-based administration of the MPRE. In August, over 4,000 candidates took the MPRE on computers at Pearson VUE centers as part of the exam’s migration from a paper-based to a computer-based testing format. (See the MPRE Inaugural Computer-Based Administration and the Testing Column for more details.) That inaugural administration was a success, both from the candidate perspective (per the generally positive responses gathered through our post-exam survey) and based on the results of our internal testing and score reporting systems.
Why are we making the transition to computer-based testing? There are a number of reasons, including enhanced test security, more uniform testing conditions, and the ability to increase the number of pretest questions on each exam administration, which will benefit the test development process (all MPRE questions are pretested as unscored items to evaluate their performance for use on a future exam). These considerations are critical to a fair and valid test, which matters for all candidates on a high-stakes exam like the MPRE. The Pearson VUE centers will also give candidates a testing environment that is free from distractions, offers secure lockers for storing personal items, and provides test navigation tools such as screen enlargement and a digital notepad, to name just a couple. By eliminating the paper exam, we also aim to eventually produce test scores earlier. We are very aware of how important the candidate testing experience is and are committed to making this transition as smooth as possible. See more information on the benefits of computer-based testing.
Testing Task Force Practice Analysis
The Testing Task Force also met throughout the summer and, with the help of consultant ACS Ventures LLC, developed, field-tested, and deployed its nationwide practice analysis, representing Phase 2 of the Task Force’s study. A practice analysis is the basis for developing test blueprints and test design in a licensing context. While neither novel nor groundbreaking, a practice analysis is the established best practice when considering changes to a high-stakes licensing test. It is a necessary part of any testing program validation and/or redesign endeavor.
The Task Force’s practice analysis, which ran from August 1 through late September, was open to licensed lawyers who had at any point in their careers supervised newly licensed lawyers. We have collected a wealth of demographic information—such as practice area, firm size, geographic area, gender, race, and ethnicity—and are in the process of slicing and dicing (technical terms!) all the data we have received. The data will give us more information about how the practice of law for newly licensed lawyers differs, for instance, across regions and among different kinds of firms. That data will be made available as the study progresses.
Participation in the survey was a success thanks to many of you reading this issue of the Bar Examiner—you know who you are: administrators, court personnel, bar examiners, and disciplinary counsel office staff members, to name but a few. Thank you so very much for helping us distribute the survey. Reaching such a broad swath of American lawyers would not have been possible without you.
The Task Force has now begun to move into Phase 3 of its study—bar exam program design and test component design. Stay updated on this important phase of the study by subscribing to Task Force website updates at www.testingtaskforce.org/subscribe and by reading the quarterly Task Force updates in each issue of the Bar Examiner.
International Conference of Legal Regulators
In September, I attended a meeting of legal regulators from the United States and several other countries in Edinburgh, Scotland. It was an opportunity for me to learn about other countries’ approaches to lawyer discipline and admissions. And as you may know, several licensing entities are in the midst of reform. A fellow panelist, Paul Philip, Chief Executive of the Solicitors Regulation Authority (SRA)—the regulatory body for solicitors in England and Wales—spoke of the SRA’s effort to reform the path to solicitor licensure with the new Solicitors Qualifying Examination (SQE), which is set to debut in 2021.
Other speakers, especially on the discipline side, underscored the trend for disciplinary authorities to engage in risk-based regulation (or risk-informed regulation)—analysis of the potential risks to the public from lawyer misconduct and the implementation of measures proportionate to the risks posed. That risk-based approach made me think about how character and fitness investigations are conducted in the United States and whether we are going about them in the most efficient and effective way to protect the public. As usual, we will have modules devoted to character and fitness issues at our Annual Bar Admissions Conference next spring.
Academic Support Conference
Our Meetings and Education Department and testing departments also planned and presented our first academic support conference in six years. The conference, “Unpacking the Bar Exam,” was held in Madison on September 16–18. Sixty-five academic support professionals from 62 schools attended and proved to be engaged and active participants. We very much enjoyed making connections with this important group of educators, many of whom spoke of challenges within their schools, such as securing the resources they need to help all students do their best on the bar exam. That their deans sent them to Madison for this conference was encouraging.
New Director of Assessment Design and Delivery
Keeping with the back-to-school theme, the “new kid” at NCBE is a renowned expert in test design, practice analysis, applied psychometrics, research methods, and program evaluation. Dr. Mark Raymond, most recently Research Director and Principal Assessment Scientist at the National Board of Medical Examiners, started at NCBE in late September. He brings a wealth of experience and will help move us in whatever direction the Testing Task Force and our stakeholders chart for the future of the bar exam, along with the other assessment initiatives we are exploring. We couldn’t be more excited to have him as our colleague and resident expert.
Welcome to NCBE’s New Chair of the Board of Trustees
I’ll close by stating that it is my distinct pleasure to work with Judge Cynthia L. Martin in her year as NCBE Board chair. Cindy sits on the Missouri Court of Appeals in Kansas City and has been involved with NCBE for over 10 years. She is a natural leader, a hard worker, and an innovator. As chair of NCBE’s Testing Task Force, she does a lot of heavy lifting on that momentous study—by itself a full-time job. That she manages her volunteer role at NCBE in addition to her very full-time day job is a testament to her dedication, intelligence, and what appears to be an endless supply of energy. It’s already been a very productive year in large part due to Cindy’s vision for NCBE and her tireless approach to serving the organization.
Fall is a season of transition, and that is certainly true here at NCBE. There is a lot on the horizon, and we are very excited about our focus on 2020 and beyond.
Until the next issue,
Judith A. Gundersen